Search results for "Generalization error"

Showing 4 of 4 documents

Simulated Annealing Technique for Fast Learning of SOM Networks

2011

The Self-Organizing Map (SOM) is a popular unsupervised neural network able to provide effective clustering and data visualization for multidimensional input datasets. In this paper, we present an application of the simulated annealing procedure to the SOM learning algorithm, with the aim of obtaining fast learning and better performance in terms of quantization error. The proposed learning algorithm, called Fast Learning Self-Organized Map, preserves the simplicity of the standard SOM's basic learning algorithm. It also improves the quality of the resulting maps by providing better clustering quality and topology preservation of input multi-dimensi…
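The abstract combines SOM training with a simulated-annealing schedule and evaluates quantization error. The sketch below is not the paper's Fast Learning Self-Organized Map algorithm, whose details are not given here; it is a minimal generic SOM trainer in which an annealing-style temperature drives both the learning rate and the neighborhood radius, plus the standard quantization-error metric the abstract refers to. All function and parameter names are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid_h=5, grid_w=5, iters=500, t0=1.0, t_min=0.01, seed=0):
    """Minimal SOM sketch: an annealing-style temperature t cools from t0 to
    t_min and controls both the learning rate and the neighborhood width.
    (Illustrative only; not the paper's FLSOM algorithm.)"""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates, used to measure distance from the winning node.
    coords = np.stack(
        np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1
    )
    for it in range(iters):
        # Exponential cooling, as in simulated annealing.
        t = max(t_min, t0 * (t_min / t0) ** (it / iters))
        x = data[rng.integers(len(data))]
        # Best-matching unit (BMU): the node whose weight is closest to x.
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        # Gaussian neighborhood that shrinks as the temperature cools.
        grid_d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-grid_d2 / (2 * (t * max(grid_h, grid_w)) ** 2))
        # Pull neighboring weights toward the sample, scaled by t.
        weights += t * h[..., None] * (x - weights)
    return weights

def quantization_error(data, weights):
    """Mean distance from each sample to its best-matching unit."""
    flat = weights.reshape(-1, weights.shape[-1])
    d = np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1)
    return d.min(axis=1).mean()
```

Training should reduce the quantization error relative to the random initial codebook; the cooling rate and initial temperature are the knobs a simulated-annealing variant would tune.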

Keywords: Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Computer Science::Machine Learning; Self-Organizing Map (SOM); Simulated annealing; Adaptive simulated annealing; Fast learning; Clustering; Cluster analysis; Data visualization; Data mining; Unsupervised learning; Artificial neural network; Artificial intelligence; Wake-sleep algorithm; Pattern recognition; Topology (electrical circuits); Generalization error; Software

Adjusted bat algorithm for tuning of support vector machine parameters

2016

Support vector machines are a powerful and widely used supervised learning technique for classification. The quality of the constructed classifier can be improved by appropriate selection of the learning parameters. These parameters are often tuned using grid search with a relatively large step. This optimization can be done more efficiently and more precisely using stochastic search metaheuristics. In this paper, we propose an adjusted bat algorithm for support vector machine parameter optimization and show that, compared to grid search, it leads to a better classifier. We tested our approach on a standard set of benchmark data sets from the UCI machine learning repositor…
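The abstract contrasts grid search with a metaheuristic for tuning SVM parameters (typically C and the RBF gamma). The sketch below is a minimal bat-algorithm loop in the canonical form (frequency-driven velocities, loudness-gated acceptance, local walk around the current best); the "adjusted" variant of the paper is not specified here. The objective is a surrogate function standing in for cross-validation error over (log C, log gamma) — in real use you would plug in an SVM cross-validation loop. All names are illustrative assumptions.

```python
import numpy as np

def bat_algorithm(objective, bounds, n_bats=20, iters=100, seed=0):
    """Minimal bat-algorithm sketch for continuous minimization.
    `objective` stands in for the CV error of an SVM as a function of
    (log C, log gamma); `bounds` is a list of (low, high) per dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pos = rng.uniform(lo, hi, (n_bats, dim))
    vel = np.zeros((n_bats, dim))
    fit = np.array([objective(p) for p in pos])
    best, best_f = pos[np.argmin(fit)].copy(), fit.min()
    loudness = np.full(n_bats, 0.9)  # A_i: gates acceptance of new solutions
    pulse = np.full(n_bats, 0.5)     # r_i: rate of local search around best
    for t in range(iters):
        for i in range(n_bats):
            freq = rng.uniform(0.0, 2.0)            # random pulse frequency
            vel[i] += (pos[i] - best) * freq        # canonical velocity update
            cand = np.clip(pos[i] + vel[i], lo, hi)
            if rng.random() > pulse[i]:
                # Local random walk around the current best solution.
                cand = np.clip(best + 0.01 * rng.normal(size=dim) * (hi - lo), lo, hi)
            f = objective(cand)
            if f < fit[i] and rng.random() < loudness[i]:
                pos[i], fit[i] = cand, f
                loudness[i] *= 0.95                     # quieter over time
                pulse[i] = 0.5 * (1 - np.exp(-0.1 * t))  # pulse rate grows
            if f < best_f:
                best, best_f = cand.copy(), f
    return best, best_f
```

Unlike a grid with a fixed step, the search concentrates evaluations near promising regions, which is the efficiency argument the abstract makes.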

Keywords: Support vector machine; Hyperparameter optimization; Bat algorithm; Metaheuristic; Hyper-heuristic; Particle swarm optimization; Supervised learning; Semi-supervised learning; Online machine learning; Active learning (machine learning); Cross-validation; Stability (learning theory); Computational learning theory; Generalization error; Linear classifier; Margin classifier; Kernel method; Kernel (linear algebra); Relevance vector machine; Least squares support vector machine; Structured support vector machine; Perceptron; Pattern recognition; Data mining; Artificial intelligence; Wake-sleep algorithm
Published in: 2016 IEEE Congress on Evolutionary Computation (CEC)

Improved SOM Learning using Simulated Annealing

2007

The Self-Organizing Map (SOM) algorithm has been extensively used for analysis and classification problems. For this kind of problem, datasets are becoming larger and larger, and it is necessary to speed up SOM learning. In this paper, we present an application of the Simulated Annealing (SA) procedure to the SOM learning algorithm. The goal is to obtain fast learning and better performance in terms of matching of input data and regularity of the obtained map. An advantage of the proposed technique is that it preserves the simplicity of the basic algorithm. Several tests, carried out on different large datasets, demonstrate the effectiveness of the proposed algorithm in comparis…
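This abstract, like the 2011 one, embeds the Simulated Annealing procedure in SOM learning. The paper's specific coupling is not given here, but the SA core it relies on is standard: a Metropolis acceptance rule under a cooling schedule. A minimal generic sketch, with `energy` and `neighbor` as assumed problem-specific callbacks:

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, t0=1.0, t_min=1e-3,
                        alpha=0.95, steps_per_t=50, seed=0):
    """Generic simulated-annealing loop with the Metropolis acceptance rule.
    `energy(x)` scores a state; `neighbor(x, rng)` proposes a nearby state.
    (Illustrative sketch, not the paper's SOM-specific procedure.)"""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best, best_e = x, e
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            cand = neighbor(x, rng)
            ce = energy(cand)
            # Always accept improvements; accept worse moves with
            # probability exp(-(ce - e) / t), which shrinks as t cools.
            if ce < e or rng.random() < math.exp(-(ce - e) / t):
                x, e = cand, ce
                if e < best_e:
                    best, best_e = x, e
        t *= alpha  # geometric cooling schedule
    return best, best_e
```

The willingness to accept occasional uphill moves early on is what lets SA escape poor local configurations before the schedule freezes the search.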

Keywords: Self-Organizing Map (SOM); Simulated annealing; Adaptive simulated annealing; Training; Speedup; Matching (graph theory); Pattern recognition; Data mining; Artificial intelligence; Wake-sleep algorithm; Generalization error; Computer science

A New Min-Max Optimisation Approach for Fast Learning Convergence of Feed-Forward Neural Networks

1993

One of the most critical aspects limiting the wide use of neural networks in real-world problems is the learning process, which is known to be computationally expensive and time consuming.

Keywords: Mathematical optimization; Error function; Artificial neural network; Feed-forward neural network; Descent direction; Convergence (routing); Process (computing); Wake-sleep algorithm; Generalization error; Computer science; Artificial intelligence